Rectified Gaussian Scale Mixtures and the Sparse Non-Negative Least Squares Problem

Authors

  • Alican Nalci
  • Igor Fedorov
  • Bhaskar D. Rao
Abstract

In this paper, we develop a Bayesian evidence maximization framework to solve the sparse non-negative least squares (S-NNLS) problem. We introduce a family of scale mixtures referred to as the Rectified Gaussian Scale Mixture (R-GSM) to model the sparsity-enforcing prior distribution for the signal of interest. Through proper choice of the mixing density, the R-GSM prior encompasses a wide variety of heavy-tailed distributions, such as the rectified Laplacian and rectified Student-t distributions. Utilizing the hierarchical representation induced by the scale mixture prior, an evidence maximization or Type II estimation method based on the expectation-maximization (EM) framework is developed to estimate the hyper-parameters and to obtain a point estimate of the parameter of interest. In the proposed method, called rectified Sparse Bayesian Learning (RSBL), we provide four alternative approaches that offer a range of options for trading off computational complexity against the quality of the E-step computation: Markov chain Monte Carlo EM, linear minimum mean-square-error estimation, approximate message passing, and a diagonal approximation. Through numerical experiments, we show that the proposed RSBL method outperforms existing S-NNLS solvers in terms of both signal and support recovery.
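The prior construction described above can be illustrated with a small sampling sketch. This is a minimal illustration, not the paper's implementation: it assumes the rectified Gaussian is the half-normal density $2\mathcal{N}(x;0,\gamma)u(x)$, and the two mixing choices shown (exponential and inverse-gamma on the variance) are the standard ones that yield rectified-Laplacian-like and rectified-Student-t-like priors, respectively.

```python
import numpy as np

rng = np.random.default_rng(0)

def rgsm_samples(n, mixing="exponential", scale=1.0):
    # Draw a per-sample variance gamma from the mixing density, then draw
    # from the rectified (half-normal) Gaussian with that variance:
    # x = |N(0, gamma)|, which samples 2*N(x; 0, gamma)*u(x).
    if mixing == "exponential":
        # Exponential mixing on the variance -> rectified-Laplacian-like prior.
        gamma = rng.exponential(scale, size=n)
    elif mixing == "inverse-gamma":
        # Inverse-gamma mixing on the variance -> rectified-Student-t-like prior.
        gamma = 1.0 / rng.gamma(shape=1.0, scale=1.0 / scale, size=n)
    else:
        raise ValueError(f"unknown mixing density: {mixing}")
    return np.abs(rng.normal(0.0, np.sqrt(gamma)))

# All samples are non-negative, with most mass concentrated near zero and
# heavy tails -- the qualitative behavior a sparsity-enforcing prior needs.
x = rgsm_samples(10_000, mixing="exponential")
```

The heavy-tailed, non-negative shape of these draws is what motivates using the R-GSM as a prior for S-NNLS; the hierarchical form (Gaussian conditioned on a latent scale) is what makes the EM-based Type II updates tractable.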


Similar Articles

Positive solution of a non-square fully fuzzy linear system of equations in general form using the least squares method

In this paper, we propose the least-squares method for computing the positive solution of an $m\times n$ fully fuzzy linear system (FFLS) of equations, where $m > n$, based on Kaufmann's arithmetic operations on fuzzy numbers introduced in [18]. First, we consider the case where all elements of the coefficient matrix are non-negative or non-positive. Also, we obtain the 1-cut of the fuzzy number vector solution of ...


Large-scale Inversion of Magnetic Data Using Golub-Kahan Bidiagonalization with Truncated Generalized Cross Validation for Regularization Parameter Estimation

In this paper, a fast method for large-scale sparse inversion of magnetic data is considered. The L1-norm stabilizer is used to generate models with sharp and distinct interfaces. To deal with the non-linearity introduced by the L1-norm, a model-space iteratively reweighted least squares algorithm is used. The original model matrix is factorized using the Golub-Kahan bidiagonalization that proje...


A New Technique for Image Zooming Based on the Moving Least Squares

In this paper, a new zooming method for gray-scale and color images based on their local information is offered. In the proposed method, the unknown values of the new pixels in the image are computed by Moving Least Squares (MLS) approximation based on both quadratic spline and Gaussian-type weight functions. The numerical results showed that this method is preferable to biline...


Sparse Bayes estimation in non-Gaussian models via data augmentation

In this paper, we provide a data-augmentation scheme that unifies many common sparse Bayes estimators into a single class. This leads to simple iterative algorithms for estimating the posterior mode under arbitrary combinations of likelihoods and priors within the class. The class itself is quite large: for example, it includes quantile regression, support vector machines, and logistic and multi...


Iteratively re-weighted least-squares and PEF-based interpolation

Interpolation methods frequently deal poorly with noise. Least-squares-based interpolation methods can deal well with noise, as long as it is Gaussian and zero-mean. When this is not the case, other methods are needed. I use an iteratively reweighted least-squares scheme to interpolate both regular and sparse data with non-stationary prediction-error filters. I show that multi-scale methods are...



Journal:
  • CoRR

Volume: abs/1601.06207  Issue: -

Pages: -

Publication year: 2016